Some Inequalities Among New Divergence Measures

Author

  • Inder Jeet Taneja
Abstract

There are three classical divergence measures in the literature on information theory and statistics, namely the Jeffreys-Kullback-Leibler J-divergence, the Burbea-Rao [1] Jensen-Shannon divergence, and the Taneja [8] arithmetic-geometric mean divergence. These three measures bear an interesting relationship with one another and are based on logarithmic expressions. Divergence measures such as the Hellinger discrimination, the symmetric χ²-divergence, and the triangular discrimination are also known in the literature and are not based on logarithmic expressions. In past years, Dragomir et al. [3], Kumar and Johnson [7], and Jain and Srivastava [4] studied different kinds of divergence measures. In this paper we establish inequalities relating these new measures to the previously known ones. An idea of an exponential divergence is also developed.
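The measures named above have standard definitions for discrete probability distributions P = (p_1, ..., p_n) and Q = (q_1, ..., q_n). The short Python sketch below is not part of the paper; it simply evaluates each measure numerically (the distributions p and q are illustrative assumptions), so the inequalities among them can be checked empirically.

    import numpy as np

    # Two strictly positive discrete distributions on the same support
    # (illustrative values only).
    p = np.array([0.5, 0.5])
    q = np.array([0.9, 0.1])

    def J(p, q):        # Jeffreys-Kullback-Leibler J-divergence
        return np.sum((p - q) * np.log(p / q))

    def I(p, q):        # Jensen-Shannon divergence
        m = (p + q) / 2
        return 0.5 * (np.sum(p * np.log(p / m)) + np.sum(q * np.log(q / m)))

    def T(p, q):        # arithmetic-geometric mean divergence
        return np.sum(((p + q) / 2) * np.log((p + q) / (2 * np.sqrt(p * q))))

    def h(p, q):        # Hellinger discrimination
        return 0.5 * np.sum((np.sqrt(p) - np.sqrt(q)) ** 2)

    def Psi(p, q):      # symmetric chi-square divergence
        return np.sum((p - q) ** 2 * (p + q) / (p * q))

    def Delta(p, q):    # triangular discrimination
        return np.sum((p - q) ** 2 / (p + q))

    for name, value in [("Delta/4", Delta(p, q) / 4), ("I", I(p, q)),
                        ("h", h(p, q)), ("J/8", J(p, q) / 8),
                        ("T", T(p, q)), ("Psi/16", Psi(p, q) / 16)]:
        print(f"{name:8s} = {value:.6f}")

For these example distributions the printed values increase in the order Δ/4 ≤ I ≤ h ≤ J/8 ≤ T ≤ Ψ/16, consistent with the refinement chain among symmetric divergence measures discussed in the related work listed below.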

Related articles

Information Measures via Copula Functions

In applications of differential geometry to problems of parametric inference, the notion of divergence is often used to measure the separation between two parametric densities. Among these, in this paper we examine measures such as the Kullback-Leibler information, the J-divergence, the Hellinger distance, the -divergence, and so on. Properties and results related to the distance between probability d...

Refinement Inequalities among Symmetric Divergence Measures

There are three classical divergence measures in the literature on information theory and statistics, namely Jeffreys-Kullback-Leibler's J-divergence, Sibson-Burbea-Rao's Jensen-Shannon divergence, and Taneja's arithmetic-geometric mean divergence. These bear an interesting relationship with one another and are based on logarithmic expressions. The divergence measures like the Hellinger discriminatio...

Sequences of Inequalities Among New Divergence Measures

Inder Jeet Taneja, Departamento de Matemática, Universidade Federal de Santa Catarina, 88.040-900 Florianópolis, SC, Brazil. e-mail: [email protected] http://www.mtm.ufsc.br/∼taneja Abstract: There are three classical divergence measures in the literature on information theory and statistics, namely the Jeffreys-Kullback-Leibler [5, 6] J-divergence, the Sibson-Burbea-Rao [1] Jensen-Shannon ...

Sequence of inequalities among fuzzy mean difference divergence measures and their applications

This paper presents a sequence of fuzzy mean difference divergence measures. The validity of these fuzzy mean difference divergence measures is proved axiomatically. In addition, a sequence of inequalities among some of these fuzzy mean difference divergence measures is introduced. The applications of the proposed fuzzy mean difference divergence measures in the context of pattern recognition have ...

Nested Inequalities Among Divergence Measures

In this paper we have considered an inequality involving 11 divergence measures. Three of them are logarithmic, namely the Jeffreys-Kullback-Leibler [4] [5] J-divergence, the Burbea-Rao [1] Jensen-Shannon divergence, and the Taneja [7] arithmetic-geometric mean divergence. Another three are non-logarithmic, namely the Hellinger discrimination, the symmetric χ²-divergence, and the triangular discrimination. Three more ...


Journal:
  • CoRR

Volume: abs/1010.0412  Issue: -

Pages: -

Publication date: 2010